Dynamic URLs vs. Static URLs - Tuesday, September 06, 2011

The Issue at Hand
Websites that use a database to insert content into a webpage by way of a dynamic script such as PHP or JavaScript are increasingly popular. This type of site is considered dynamic. Many websites choose dynamic content over static content because, if a website has thousands of products or pages, writing or updating each static page by hand is a monumental task.

There are two types of URLs: dynamic and static. A dynamic URL is a page address that results from the search of a database-driven web site or the URL of a web site that runs a script. In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site's database. The dynamic page is basically only a template in which to display the results of the database query. Instead of changing information in the HTML code, the data is changed in the database.

But there is a risk when using dynamic URLs: search engines don't like them. Those at most risk of losing search engine positioning due to dynamic URLs are e-commerce stores, forums, sites built on content management systems such as Mambo, blogs such as WordPress, and any other database-driven website. Many times the URL that is generated for the content in a dynamic site looks something like this:

   http://www.somesites.com/forums/thread.php?threadid=12345&sort=date

A static URL on the other hand, is a URL that doesn't change, and doesn't have variable strings. It looks like this:

   http://www.somesites.com/forums/the-challenges-of-dynamic-urls.htm

Static URLs typically rank better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs also make it easier for the end-user to see at a glance what the page is about. If a user sees a URL in a search engine result that matches the title and description, they are more likely to click on that URL than on one that doesn't make sense to them.

A search engine wants to list only unique pages in its index. One way search engines combat duplicate-looking URLs is by cutting the URL off after a certain number of variable characters (e.g. ?, &, =).

For example, let's look at three URLs:

   http://www.somesites.com/forums/thread.php?threadid=12345&sort=date
   http://www.somesites.com/forums/thread.php?threadid=67890&sort=date
   http://www.somesites.com/forums/thread.php?threadid=13579&sort=date

All three of these URLs point to three different pages. But if the search engine truncates the information after the first offending character, the question mark (?), all three pages look the same:

   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php

Now, you don't have unique pages, and consequently, the duplicate URLs won't be indexed.

Another issue is that dynamic pages generally do not have any keywords in the URL. It is very important to have keyword rich URLs. Highly relevant keywords should appear in the domain name or the page URL. This became clear in a recent study on how the top three search engines, Google, Yahoo, and MSN, rank websites.

The study involved taking hundreds of highly competitive keyword queries, like travel, cars, and computer software, and comparing factors involving the top ten results. The statistics show that of those top ten, Google has 40-50% of those with the keyword either in the URL or the domain; Yahoo shows 60%; and MSN has an astonishing 85%! What that means is that to these search engines, having your keywords in your URL or domain name could mean the difference between a top ten ranking, and a ranking far down in the results pages.

The Solution
So what can you do about this difficult problem? You certainly don't want to have to go back and recode every single dynamic URL into a static URL. This would be too much work for any website owner.

If you are hosted on a Linux server, you will want to make the most of the Apache Mod Rewrite Rule (the mod_rewrite module), which gives you the ability to transparently redirect one URL to another, without the user's (or a search engine's) knowledge. You will need to have this module installed in Apache; for more information, see the Apache mod_rewrite documentation. This module saves you from having to recode your dynamic URLs into static ones by hand.
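
How you enable the module depends on your host. On a server you manage yourself, it is often just a matter of loading the module and allowing .htaccess overrides; the command and setting below are assumptions about a typical Debian/Ubuntu-style Apache setup, so check your own server's documentation:

   # Enable mod_rewrite (Debian/Ubuntu-style Apache; other distributions differ)
   a2enmod rewrite

   # In the relevant <Directory> block of your Apache configuration,
   # .htaccess files must be allowed to override settings:
   AllowOverride All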

How does this module work? When a request comes in to the server for the new static URL, the Apache module rewrites the URL internally to the old dynamic URL, while the address shown to the visitor remains the new static URL. The web server compares the URL requested by the client with the search pattern in the individual rules.

For example, when someone requests this URL:
   http://www.somesites.com/forums/thread-threadid-12345.htm

The server looks for and compares this static-looking URL to what information is listed in the .htaccess file, such as:

   RewriteEngine on
   RewriteRule thread-threadid-(.*)\.htm$ thread.php?threadid=$1

It then converts the static URL to the old dynamic URL that looks like this, with no one the wiser:
   http://www.somesites.com/forums/thread.php?threadid=12345
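
If you prefer the keyword-rich style of static URL shown earlier, where the thread title appears in the file name, a slightly different pattern can pull the numeric ID from the end of the name. This is only a sketch – the exact pattern depends on how you choose to build your static URLs – but it shows the idea:

   RewriteEngine on
   # Assumes this .htaccess sits in the /forums/ directory.
   # Maps e.g. the-challenges-of-dynamic-urls-12345.htm to thread.php?threadid=12345
   RewriteRule ^[a-z0-9-]+-([0-9]+)\.htm$ thread.php?threadid=$1 [L]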

You now have a URL that will not only rank better in the search engines, but that also lets your end-users tell at a glance what the page is about, while Apache's Mod Rewrite Rule handles the conversion for you and the underlying dynamic URL stays in place.

If you are not particularly technical, you may not wish to puzzle out the Mod Rewrite syntax and how to use it, or you simply may not have the time to embark on a new learning curve. In that case, it is extremely useful to have something do it for you. This URL Rewriting Tool can definitely help: it generates the Mod Rewrite Rule for your .htaccess file so that one URL is transparently served in place of another, such as a static URL in place of a dynamic one.

With the URL Rewriting Tool, you can opt to rewrite single pages or entire directories. Simply enter the URL into the box, press submit, and copy and paste the generated code into your .htaccess file on the root of your website. You must remember to place any additional rewrite commands in your .htaccess file for each dynamic URL you want Apache to rewrite. Now, you can give out the static URL links on your website without having to alter all of your dynamic URLs manually because you are letting the Mod Rewrite Rule do the conversion for you, without JavaScript, cloaking, or any sneaky tactics.

Another thing you must remember to do is to change all of your links in your website to the static URLs in order to avoid penalties by search engines due to having duplicate URLs. You could even add your dynamic URLs to your Robots Exclusion Standard File (robots.txt) to keep the search engines from spidering the duplicate URLs. Regardless of your methods, after using the URL Rewrite Tool, you should ideally have no links pointing to any of your old dynamic URLs.
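
If you do decide to keep spiders away from the old dynamic URLs, the robots.txt entry can be as simple as the sketch below (this assumes, as in the earlier examples, that all of your dynamic pages run through thread.php):

   # Keep all crawlers off the old dynamic URLs; the static versions stay crawlable
   User-agent: *
   Disallow: /forums/thread.php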

You have multiple reasons to use static URLs in your website whenever possible. When that's not possible and you need to keep your database-driven content at the old dynamic URLs, you can still give end-users and search engines a static URL to navigate, while behind the scenes these are still your dynamic URLs in disguise. When a search engine engineer was asked whether this method was considered "cloaking", he responded that it was not, and that in fact search engines prefer you do it this way. The URL Rewrite Tool not only saves you time and energy by converting static URLs transparently to your dynamic URLs, it will also protect your rankings in the search engines.

The Importance of Backlinks - Tuesday, September 06, 2011

If you've read anything about or studied Search Engine Optimization, you've come across the term "backlink" at least once. For those of you new to SEO, you may be wondering what a backlink is, and why they are important. Backlinks have become so important to the scope of Search Engine Optimization, that they have become some of the main building blocks to good SEO. In this article, we will explain to you what a backlink is, why they are important, and what you can do to help gain them while avoiding getting into trouble with the Search Engines.

What are "backlinks"? Backlinks are links that are directed towards your website. Also knows as Inbound links (IBL's). The number of backlinks is an indication of the popularity or importance of that website. Backlinks are important for SEO because some search engines, especially Google, will give more credit to websites that have a good number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query.

When search engines calculate the relevance of a site to a keyword, they consider the number of QUALITY inbound links to that site. So we should not be satisfied with merely getting inbound links; it is the quality of the inbound link that matters.
A search engine considers the content of the sites to determine the QUALITY of a link. When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.

For example, if a webmaster has a website about how to rescue orphaned kittens and receives a backlink from another website about kittens, that link carries more weight in a search engine's assessment than, say, a link from a site about car racing. The more relevant the linking site is to your website, the better the quality of the backlink.

Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is a lot harder to influence a search engine with external backlinks from other websites. This is also a reason why backlinks factor so heavily into a search engine's algorithm. Lately, however, search engines' criteria for quality inbound links have gotten even tougher, thanks to unscrupulous webmasters trying to gain inbound links through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide inbound links to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

Another reason to achieve quality backlinks is to entice visitors to come to your website. You can't build a website, and then expect that people will find your website without pointing the way. You will probably have to get the word out there about your site. One way webmasters got the word out used to be through reciprocal linking. Let's talk about reciprocal linking for a moment.

There has been much discussion in the last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed to reciprocal link exchanges in order to boost their sites' rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were discounted. So while the irrelevant inbound link was ignored, the outbound link still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.

We must be careful with our reciprocal links. There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy a site is that you link to from your own website. This means that you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier about which sites we exchange links with right now. By choosing only relevant sites to link with, sites that don't have tons of outbound links on a page, and sites that don't practice black-hat SEO techniques, we stand a better chance that our reciprocal links won't be discounted.

Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine as if you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and placing too many links to sites on the same IP address is referred to as backlink bombing.

One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.

There are a few things to consider when beginning your backlink building campaign. It is helpful to keep track of your backlinks, to know which sites are linking back to you, and how the anchor text of each backlink incorporates keywords relating to your site. A tool to help you keep track of your backlinks is the Domain Stats Tool. This tool displays the backlinks of a domain in Google, Yahoo, and MSN. It will also tell you a few other details about your website, such as your listing in the Open Directory (DMOZ), whose backlinks Google regards as highly important; your Alexa traffic rank; and how many pages from your site have been indexed, to name just a few.

Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site; rather, you need a large number of QUALITY inbound links. This tool searches for websites with a theme related to yours that are likely to add your link to their pages. You specify a particular keyword or keyword phrase, and the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and makes the job easier in the process.

There is another factor, in addition to related site themes, that makes a backlink to your site valuable: anchor text. When a link incorporates a keyword into the text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the most under-estimated resources a webmaster has. Instead of using words like "click here", which probably don't relate in any way to your website, using the words "Please visit our tips page for how to nurse an orphaned kitten" is a far better way to utilize a hyperlink. A good tool for finding your backlinks and the text being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that your site is being linked to from another website but the anchor text is not being utilized properly, you should request that the website change the anchor text to something incorporating relevant keywords. This will also help boost your quality backlinks score.
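
As a quick illustration (the URL here is made up for the example), compare a generic link with a keyword-rich one:

   <!-- Weak: the anchor text says nothing about the target page -->
   <a href="http://www.example.com/orphaned-kitten-care.htm">click here</a>

   <!-- Better: the anchor text itself carries the relevant keywords -->
   <a href="http://www.example.com/orphaned-kitten-care.htm">how to nurse an orphaned kitten</a>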

Building quality backlinks is extremely important to Search Engine Optimization, and because of their importance, it should be very high on your priority list in your SEO efforts. We hope you have a better understanding of why you need good quality inbound links to your site, and have a handle on a few helpful tools to gain those links.

Optimization, Over-Optimization or SEO Overkill? - Tuesday, September 06, 2011

The fight to top search engines' results knows no limits – neither ethical, nor technical. There are often reports of sites that have been temporarily or permanently excluded from Google and the other search engines because of malpractice and the use of "black hat" SEO techniques. The reaction of the search engines is easy to understand – with so many tricks and cheats in circulation, the relevancy of returned results can be seriously compromised, to the point where search engines end up delivering irrelevant and manipulated results. And even if search engines do not discover your scams right away, your competitors might report you.

Keyword Density or Keyword Stuffing?

Sometimes SEO experts go too far in their desire to push their clients' sites to top positions and resort to questionable practices, like keyword stuffing. Keyword stuffing is considered an unethical practice because what you actually do is use the keyword in question suspiciously often throughout the text. Bearing in mind that the commonly recommended keyword density is from 3 to 7%, anything above this – say 10% density – starts to look very much like keyword stuffing, and it is unlikely to go unnoticed by search engines. A text with 10% keyword density can hardly make sense when read by a human. Some time ago Google implemented the so-called "Florida Update" and essentially imposed a penalty for pages that are keyword-stuffed and over-optimized in general.
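
As a rough illustration of the arithmetic (different tools count words slightly differently, so treat the numbers as approximate):

   keyword density = (keyword occurrences / total words) x 100
   15 occurrences in a 500-word page = (15 / 500) x 100 = 3%   (within the usual range)
   50 occurrences in the same page   = (50 / 500) x 100 = 10%  (stuffing territory)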

Generally, keyword density in the title, the headings, and the first paragraphs matters more. Needless to say, you should be especially careful not to stuff these areas. Try the Keyword Density Cloud tool to check that your keyword density is within acceptable limits, especially in the above-mentioned places. If you have a high density percentage for a frequently used keyword, consider replacing some occurrences of the keyword with synonyms. Also, words in bold and/or italic are generally considered important by search engines, but if every occurrence of the target keywords is in bold or italic, this too looks unnatural, and at best it will not push your page up.

Doorway Pages and Hidden Text

Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice and there were times when they were not even considered an illegitimate optimization technique. A doorway page is a page that is made especially for the search engines, has no meaning for humans, and is used to get high positions in search engines and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect in determining the position of a site in search results, so doorway pages no longer bring much traffic to a site – and if you use them, don't be surprised when Google penalizes you.

Very similar to doorway pages is a scam called hidden text. This is text that is invisible to humans (e.g. the text color is the same as the page background) but is included in the HTML source of the page, trying to fool search engines into believing that the page is keyword-rich. Needless to say, neither doorway pages nor hidden text can really be qualified as optimization techniques; they are manipulation more than anything else.

Duplicate Content

It is a basic SEO rule that content is king. But not duplicate content. In Google's terms, duplicate content means text that is the same as the text on a different page on the SAME site (or on a sister site, or on a site that is so heavily linked to the site in question that the two can be presumed related) – i.e. when you copy and paste the same paragraphs from one page on your site to another, you can expect to see your site's rank drop. Most SEO experts believe that syndicated content is not treated as duplicate content, and there are many examples of this; if syndicated content were duplicate content, the sites of news agencies would have been the first to drop out of search results. Still, it does not hurt to check from time to time whether your site shares duplicate content with another, at least because somebody might be illegally copying your content without your knowledge. The Similar Page Checker tool will help you see if you have grounds to worry about duplicate content.

Link Spam

Links are another major SEO tool and, like the other SEO tools, they can be used or misused. While backlinks are certainly important (for Yahoo the quantity of backlinks matters most, while for Google it matters more which sites the backlinks come from), getting tons of backlinks from a link farm or a blacklisted site is begging to be penalized. Also, if outbound links (links from your site to other sites) considerably outnumber your inbound links (links from other sites to your site), then you have put too much effort into creating links that will not improve your ranking. You can use the Domain Stats Tool to see the number of backlinks (inbound links) to your site and the Site Link Analyzer to see how many outbound links you have.

Using keywords in links (the anchor text), domain names, folder and file names does boost your search engine rankings, but again, there is a fine line between topping the search results and being kicked out of them. For instance, suppose you are optimizing for the keyword "cat", a frequently chosen keyword for which, as with all popular keywords and phrases, competition is fierce. You might see no alternative for reaching the top other than registering a domain name like http://www.cat-cats-kittens-kitty.com, which is no doubt packed with keywords to the maximum, but it is, first, difficult to remember, and second, if the content does not live up to the plenitude of cats in the domain name, you will never top the search results.

Although file and folder names are less important than domain names, now and then (but definitely not all the time) you can include "cat" (and synonyms) in them and in the anchor text of links. This counts well, provided the anchors are not artificially stuffed (for instance, if you use "cat_cats_kitten" as the anchor for internal site links, that anchor certainly is stuffed). While you have no control over third parties that link to you using anchors you don't like, it is up to you to check periodically what anchor text other sites use to link to you. A handy tool for this task is the Backlink Anchor Text Analysis, where you enter your URL and get a listing of the sites that link to you and the anchor text they use.

Finally, to Google and the other search engines it makes no difference whether a site is intentionally over-optimized to cheat them or the over-optimization is the result of good intentions. So no matter what your motives are, always keep to reasonable practices and remember not to overstep the line.

What is Robots.txt - Tuesday, September 06, 2011

Robots.txt

It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk having a duplicate content penalty imposed on you. Also, if you have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index these pages (although in this case the only sure way to keep sensitive data from being indexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets, and JavaScript from indexing, you also need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your Web site to avoid is with the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines about your wishes is to use a robots.txt file.

What Is Robots.txt?

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is something like putting a note "Please do not enter" on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

The concept and structure of robots.txt was developed more than a decade ago, and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will look at the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and barely flexible) – it is essentially a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

"User-agent:" names the search engine crawler a record applies to, and "Disallow:" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
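
Tying this back to the examples at the beginning of the article, a slightly fuller file might keep crawlers away from printer-friendly duplicates and static assets. The directory names below are only assumptions about how a site might be organized:

   # Printer-friendly duplicates and static assets are off-limits to all crawlers
   User-agent: *
   Disallow: /print/
   Disallow: /images/
   Disallow: /css/
   Disallow: /js/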

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example is from a robots.txt file that disallows the /temp/ directory for all agents and then gives Googlebot a more restrictive record of its own. The trap here is that records do not combine: under the Robots Exclusion Standard, a crawler obeys only the record that matches it most specifically, so Googlebot follows its own record and ignores the general one. In this particular file that happens to be fine, because Disallow: /temp/ is repeated in the Googlebot record, but if you forgot to repeat it there, Googlebot would crawl /temp/ even though you thought the first record covered it (see the sketch below). You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.
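
Here is that broken variant – the general record is the only place /temp/ is disallowed, so a crawler with its own record never sees that restriction:

   User-agent: *
   Disallow: /temp/

   # Googlebot obeys only its own record, so nothing below blocks /temp/ for it
   User-agent: Googlebot
   Disallow: /images/
   Disallow: /cgi-bin/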

Tools to Generate and Validate a Robots.txt File

Given the simple syntax of a robots.txt file, you can always read it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing hyphens, slashes, or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong, because there is no hyphen between "User" and "agent" and the syntax is incorrect.

In cases where you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that let you point and click to select which files and folders are to be excluded. And even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.

Your Website from Google Banned to Google Unbanned - Tuesday, September 06, 2011

Even if you are not looking for trouble and do not violate any known Google SEO rule, you still might have to experience the ultimate SEO nightmare – being excluded from Google's index. Although Google is something of a monopolist among search engines, it is not a bully that excludes innocent victims for pure pleasure. Google keeps rigorously to its published guidelines and excludes sites that misbehave.

If you own and run a blog or website then being listed by Google is a very important step so it is read by as many people as possible; but what if your website gets Google banned? If this has happened to you, then you know that it hurts your site because you won’t show up in the Google search engine and that means less traffic to your site. Getting unbanned from Google is a long and drawn out process. And sometimes Google won’t even tell you the reason they banned your website in the first place, which doesn’t make things any easier.

Some of the ways a site can get Google banned include having spam on it, stuffing in too many keywords, making your own URLs redirect to one another, improperly setting up a robots.txt file, duplicating your own pages and sending people to them over and over, and linking to bad sites such as those with adult content, gambling, or other unauthorized material. There are multiple other reasons, so it's a good idea to try to get Google to tell you the reason you were banned. That will make it much simpler to fix the problem. Over-optimization has many faces, and you can have a look at the Optimization, Over-Optimization or SEO Overkill? article to get an idea of practices you should avoid.

Here are the steps you need to follow to get Google to reconsider your site and lift the ban. Be sure to follow the Google reconsideration request process precisely and correctly if you want your website unbanned and your site back in business providing whatever products or services it offers:

1 Send a Google Reconsideration Request for Getting Unbanned

Getting your website reincluded in Google requires putting in a Google reconsideration request. First, the way you usually notice your site is Google banned is that it suddenly has no PageRank. Then, to determine for sure that this is the case, search Google for site:www.yoursite.com, using your own domain in place of yoursite. If you don't see any of your pages there, then it's likely you were Google banned.

Another way to tell if you are truly Google banned is to see whether your pages still show up in Google's index. Or, if your site is a news blog, you can go to news.google.com, and if you don't see your articles there, you will also know you were probably banned from Google and now need to send a Google reconsideration request.

2 Be Polite to Google

Next, remember that you are sending your Google reconsideration request to a real person who works for Google, and someone will actually read your reconsideration request at a Google office before you are unbanned. Therefore you want to be polite and go into as much detail as possible, as it is better to give too much information than not enough in this situation. Being nice counts here, and if you act like a jerk, it's likely no one will want to help you.

3 Provide Information about the Domain

List things such as whether it was a brand new domain name, give them some background about your website, and tell them which rules you think you may have broken. If there have been spam clicks on your account, gather the proof and write to them about it. This shows them you are serious about resolving the problem when sending the Google reconsideration request. Put down everything you think someone would need to know in order to understand who you are and to jog their memory on why you were banned in the first place. Be sure to do your research so you will understand what is going on and can fully explain it to the Google representatives when sending the reconsideration request to Google.

4 Explain the Solution to the Past Problem

When sending the reconsideration request to Google, tell the representative what you have already done to fix the problem that caused you to be banned. Spell it out in detail and give them the actual page URLs to prove it. It's best to give as much information and data as you can so they will understand what you did to solve the issue. For example, if your site was linked to bad links, then you must make sure you remove or unlink every one of them. Be sure to have removed all spam, or anything else Google doesn't approve of or like. Then prove to Google that you did this by showing them the evidence. Or, if you were flagged for invalid clicks, which is one of the common reasons to get Google banned, explain why you believe the clicks were valid. It takes this sort of detailed information to make them understand the situation and help you resolve it. Also, ensure that the changes now made to your website meet the requirements for Google reinclusion. Don't do a single thing on your website that might annoy them.

5 Verify the Website

Next, log in to your Google Webmaster account and add and verify your site. Then go to http://www.google.com/webmasters/tools/reconsideration. This is the area you use to put in your reconsideration request to Google to be unbanned. You can also send the information in an email to help@google.com, which is where Google representatives give support to customers. You may also have to sign up for Google Webmaster Tools once you are logged into your account, if you don't already have it.

6 Provide Proof

It's never a good idea to play the fool and try to blame Google, or to claim you had no idea what you did wrong. You need real proof for Google reinclusion, not blame or excuses. Show them the proof of the changes you made for Google reconsideration. And lastly, always be considerate and thank them for the time and effort they are taking to look into your reconsideration request, to help you solve the problems, and to get you unbanned from Google so your site can be relisted and you can keep getting the traffic you need to run your business, blog, or news site.

7 Be Patient

It can take several weeks for a Google representative to get back to you and answer your Google reinclusion request. They have a lot of other things to handle, and you need to understand that you aren't the only one who may be having issues. While you are waiting, continue to look over your site and make sure all the alleged violations are fixed and good to go.

8 Send Follow Up Email for Google reconsideration

Be sure to send follow-up emails to Google to ask how the request is going and if they know when the situation will be resolved. You probably shouldn’t send one every day, because this could be regarded as you being a pest, but be sure to send one in periodically until you get an answer that you understand and can deal with to solve the Google banned problem.

All in all, it can be a time-consuming and complicated process to take your site from Google banned to Google unbanned, but with the proper preparation and information you should be well on your way to being in Google's good graces again. It's well worth the effort, so just follow these steps and Google should get back to you and fix your situation and your site.

How to Build Backlinks - Tuesday, September 06, 2011

It is beyond question that quality backlinks are crucial to SEO success. The question, rather, is how to get them. With on-page content optimization it seems easier, because everything is up to you to do and decide; with backlinks it looks like you have to rely on others to work for your success. This is only partially true, because while backlinks are links that start on another site and point to yours, you can discuss details like the anchor text with the webmaster of the other site. Yes, it is not the same as administering your own site – i.e. you do not have total control over backlinks – but still there are many aspects that can be negotiated.

Getting Backlinks the Natural Way

The idea behind including backlinks as part of the page rank algorithm is that if a page is good, people will start linking to it. And the more backlinks a page has, the better. But in practice it is not exactly like this – or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant you can get a lot of quality backlinks, including from sites with a similar topic to yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort could be less than what you need to successfully promote your site. So you will have to resort to other ways of acquiring quality backlinks, as described next.

Ways to Build Backlinks

Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome, and the time you spend building them is not wasted. Among the acceptable ways of link building are getting listed in directories and posting in forums, blogs, and article directories. The unacceptable ways include interlinking (linking from one site to another site that is owned by the same owner or exists mainly to serve as a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.

The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.

You might wonder why sites such as those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple – they need content for their site. When you post an article or submit a link to your site, you do not get paid for it. You provide them for free with something they need – content – and in return they provide you for free with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respectable and you don't post fake links or content.

Getting Listed in Directories

If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

Forums and Article Directories

Search engines generally index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, the backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit the forum or blog policy. Also, sometimes administrators do not allow links in posts unless they are relevant ones. In some rare cases (more an exception than a rule) the owner of a forum or blog may have banned search engines from indexing it, and in that case posting backlinks there is pointless.

While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because articles are generally longer than posts and need careful thought while writing them. But it is also worth the effort, and it is not so difficult to do.

Content Exchange and Affiliate Programs

Content exchange and affiliate programs are similar to the previous method of getting quality backlinks. For instance, you can offer to interested sites RSS feeds for free. When the other site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and the abstract they read on the other site.

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks but they tend to be an expensive way because generally the affiliate commission is in the range of 10 to 30 %. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?

News Announcements and Press Releases

Although this is hardly an everyday way to build backlinks, it is an approach that gives good results, if handled properly. There are many sites (for instance, here is a list of some of them) that publish for free or for a fee news announcements and press releases. A professionally written press release about an important event can bring you many, many visitors and the backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot release press releases if there is nothing newsworthy. That is why we say that news announcements and press releases are not a commodity way to build backlinks.

Backlink Building Practices to Avoid

One practice to be avoided is link exchange. There are many programs which offer to barter links. The principle is simple – you put a link to a site, they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, take care of the ratio between outbound and inbound links: if your outbound links greatly outnumber your inbound links, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.

Linking to suspicious places is something else you must avoid. While it is true that search engines do not punish you for having backlinks from such places (because it is assumed you have no control over who links to you), if you enter a link exchange program with the so-called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time, because this looks artificial and suspicious.

Web Directories and Specialized Search Engines - Tuesday, September 06, 2011

SEO experts spend most of their time optimizing for Google and occasionally one or two other search engines. There is nothing wrong with that, and it is quite logical, bearing in mind that topping Google accounts for the lion's share of Web popularity. But very often, no matter what you do, topping Google does not happen. Or sometimes the price you need to pay (not literally, but in terms of effort and time) to top Google and stay there is too high. And then there is the ultimate SEO nightmare – being banned from Google, when you simply can't use Google (at least not until you are readmitted to the club) and, whether you like it or not, you need to have a look at possible alternatives.

What are Google Alternatives

The first alternative to Google is obvious – optimize for the other major search engines, if you have not done it already. Yahoo! and MSN (to a lesser degree) can bring you enough visitors, though sometimes it is virtually impossible to optimize for the three of them at the same time because of the differences in their algorithms. You could also optimize your site for (or at least submit to) some of the other search engines (Lycos, Excite, Netscape, etc.) but having in mind that they altogether hardly have over 3-5% of the Web search traffic, do not expect much.

Another alternative is to submit to search directories (also known as Web directories) and specialized search engines. Search directories might sound so pre-Google but submitting to the right directories might prove better than optimizing for MSN, for example. Specialized search engines and portals have the advantage that the audience they attract consists of people who are interested in a particular topic and if this is your topic, you can get to your target audience directly. It is true that specialized search engines will not bring you as many visitors, as if you were topping Google but the quality of these visitors is extremely high.

Listing all Google alternatives would make for a long list and is outside the scope of this article, but just to be a little more precise about what alternatives exist, we should at least mention SEO instruments like posting to blogs and forums, and paid advertisements.

Web Directories

What is a Web Directory?

Web directories (or as they are better known – search directories) existed before the search engines, especially Google, became popular. As the name implies, web directories are directories where different resources are gathered. Similarly to desktop directories, where you gather files in a directory based on some criterion, Web directories are just enormous collections of links to sites, arranged in different categories. The sites in a Web directory are listed in some order (most often alphabetic but it is not necessarily so) and users browse through them.

Although many Web directories offer search functionality of some kind (otherwise it would be impossible to browse thousands of pages for, let's say, Computers), search directories are fundamentally different from search engines in two ways – most directories are edited by humans, and URLs are not gathered automatically by spiders but submitted by site owners. The main advantage of Web directories is that no matter how clever spiders become, when there is a human to view and check the pages, there is less chance that pages will be classified in the wrong categories. The disadvantages of human editing are that listings in web directories are sometimes outdated, if no human was available to do the editing and checking for some time (though this is not that bad, because search engines also deliver pages that no longer exist), and that sometimes you might have to wait half a year before being included in a search directory.

The second difference – no spiders – means that you must go and submit your URL to the search directory, rather than sit and wait for the spider to come to your site. Fortunately, this is done only once for each directory, so it is not that bad.

Once you are included in a particular directory, in most cases you can stay there as long as you wish to and wait for people (and search engines) to find you. The fact that a link to your site appears in a respectable Web directory is good because first, it is a backlink and second, you increase your visibility for spiders, which in turn raises your chance to be indexed by them.

Examples of Web Directories

There are hundreds and thousands of search directories but undoubtedly the most popular one is DMOZ. It is a general purpose search directory and it accepts links to all kinds of sites. Other popular general-purpose search directories are Google Directory and Yahoo! Directory. The Best of the Web is one of the oldest Web directories and it still keeps to high standards in selecting sites.

Besides general-purpose Web directories, there are a great many topical ones. For instance, The Environment Directory lists links to environmental sites only, while The Radio Directory lists thousands of radio stations worldwide, arranged by country, format, etc. There are also many local and national Web directories, which accept links only to sites about a particular region or country and which can be great if your site is targeted at a local or national audience only. It is not possible to list here even the topics of the specialized search directories, because the list would get incredibly long. Using Google and specialized search resources like The Search Engines Directory, you can find on your own many directories related to your area of interest.

Specialized Search Engines

What is a Specialized Search Engine?

Specialized search engines are one more tool to include in your SEO arsenal. Unlike general-purpose search engines, specialized search engines index pages for particular topics only and very often there are many pages that cannot be found in general-purpose search engines but only in specialized ones. Some of the specialized search engines are huge sites that actually host the resources they link to, or used to be search directories but have evolved to include links not only to sites that were submitted to them. There are many specialized search engines for every imaginable topic and it is always wise to be aware of the specialized search engines for your niche. The examples in the next section are by no means a full list of specialized search engines but are aimed to give you the idea of what is available. If you search harder on the Web, you will find many more resources.

Examples of Specialized Search Engines

Specialized search engines are probably not as numerous as Web directories, but there is certainly no shortage of them either, especially if one counts as a specialized search engine every password-protected site whose database is accessible only from within the site. As with Web directories, a list of specialized search engines would be really, really long (and constantly changing), so instead here are some links to lists of search engines: Pandia Powersearch, Webquest, Virtual Search Engines, the already mentioned The Search Engines Directory, etc. What these lists have in common is that they offer a selection of specialized search engines arranged by topic, so they are a good starting point for the hunt for specialized search engines.
Optimizing for MSN - Tuesday, September 06, 2011

SEO experts often forget that there are three major search engines. While there is no doubt that Google is number one, with the most searches, and Yahoo! manages to get about a quarter of the market, MSN has not retired yet. It holds about 10-15 percent of searches (according to some sources even less – about 5%), but it has a loyal audience that can't be reached through the other two major search engines, so if you plan a professional SEO campaign, you can't afford to skip MSN. In a sense, getting high rankings in MSN is similar to getting high rankings for less popular keywords: because competition is not as tough, you might be able to get enough visitors from MSN alone with far less effort than it would take on a more popular search engine.

Although optimizing for MSN is different from optimizing for Google and Yahoo!, there are still common rules that will help you rank high in any search engine. As a rule, if you rank well in Google, chances are that you will rank well in Yahoo! (if you are interested in tips and tricks for optimizing for Yahoo!, have a look at the Optimizing for Yahoo! article) and in MSN as well. The opposite is not true, however: if you rank well in MSN, there is no guarantee that you'll do the same in Google. So when you optimize for MSN, keep an eye on your Google ranking as well. It's no good to top MSN and be nowhere in Google (the opposite is more acceptable, if you have to make the choice).

But why is this so? The answer is simple - the MSN algorithm is different and that is why, even if the same pages were indexed, the search results will vary.

The MSN Algorithm

As already mentioned, it is the different MSN algorithm that leads to such drastic results in ranking. Otherwise, MSN, like all search engines, first spiders the pages on the Web, then indexes them in its database and after that applies the algorithm to generate the pages with the search results. So, the first step in optimizing for MSN is the same as for the other search engines – to have a spiderable site. (Have a look at Search Engine Spider Simulator to see how spiders see your site). If your site is not spiderable, then you don't have even a hypothetical chance to top the search results.

There is quite a lot of speculation about the MSN algorithm. Looking at the search results MSN delivers, it is obvious that its search algorithm is not as sophisticated as Google's, or even Yahoo!'s, and many SEO experts agree that the MSN search algorithm is years behind its competitors. So what can you do in this case – optimize as you did for Google a couple of years ago? You are not far from the truth, though actually it is not that simple.

One of the most important differences is that MSN still relies heavily on metatags, as explained below. None of the other major search engines uses metatags that heavily anymore. It is obvious that metatags give SEO experts a great opportunity for manipulating search results. Maybe metatags are the main reason for the inaccurate search results that MSN often produces.

The second most important difference between MSN and the other major search engines is their approach to keywords. For MSN, keywords are very, very important too, but unlike Google, for MSN on-page factors dominate, while off-page factors (like backlinks, for example) are still of minor importance. It is a safe bet that the importance of backlinks will change in the future, but for now they are not a primary factor for high rankings in MSN.

Keywords, Keywords, Keywords

It is hardly surprising that keywords are the most important item for MSN. What is surprising is how much MSN relies on them. It is very easy to fool MSN – just artificially inflate your keyword density, put a couple of keywords in file names (and even better – in domain names) and near the top of the page, and you are almost done as far as MSN is concerned. But if you indulge in the above-mentioned black hat practices, your joy at topping MSN will not last long because, unless you provide separate pages optimized for Google, your stuffed pages might very well get you banned from Google. And if you decide to have separate pages for Google and MSN, first, it is hardly worth the trouble, and second, the risk of a duplicate content penalty can't be ignored.

So, what is the catch? The catch is that if you try to polish your site for MSN and stuff it with keywords, this might get you into trouble with Google, which certainly is worse than not ranking well in MSN. But if you optimize wisely, it is more likely than not that you will rank decently in Google and perform well in Yahoo! and MSN as well.

Metatags

Having meaningful metatags never hurts but with MSN this is even more important because its algorithm still uses them as a primary factor in calculating search results. Having well-written (not stuffed) metatags will help you with MSN and some other minor search engines, while at the same time well-written metatags will not get you banned from Google.

The Description metatag is very important:

<META NAME="Description" CONTENT="Place your description here" />

MSNBot reads its content and based on that (in addition to keywords found on the page) judges how to classify your site. So if you leave this tag empty (i.e. CONTENT=""), you have missed a vital chance to be noticed by MSN. There is no evidence that MSN uses the other metatags in its algorithm, which is why leaving the Description metatag empty is even more unforgivable.
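For illustration, here is a minimal sketch of the difference; the description wording is just a hypothetical example, not a recommended text:

<!-- A missed chance: MSNBot finds nothing to work with -->
<META NAME="Description" CONTENT="" />

<!-- A meaningful, non-stuffed description (hypothetical wording) -->
<META NAME="Description" CONTENT="Tips, tutorials and free tools for optimizing your site for the major search engines." />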

How to get Traffic from Social Bookmarking sites - Tuesday, September 06, 2011

Sites like digg.com, reddit.com, stumbleupon.com, etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page?
Getting to the front page of these sites is not as difficult as it seems. I have been successful with digg and del.icio.us (and not so much with Reddit, though the same steps should apply to it as well) multiple times and have compiled a list of steps that have helped me succeed:

1 Pay attention to your Headlines

Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see from your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small.
Here are some examples to start with:

Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence

Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse

Here is a good blog post that should help you with your headlines.

2 Write a meaningful & short description

The headline is very important to draw attention, but if you want to keep that attention, a meaningful description is vital. The description must be slightly provocative because this draws more attention, but still, never use lies and false facts to provoke interest. For instance, if you write “This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on.” visitors will hardly think that your story is true and facts-based.

You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance how to describe your article as briefly as possible.

3 Have a great first paragraph

This is a rule that is always true, but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headlines) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.

4 Content is king

However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King Level. If your articles are just trash, bookmarking them is useless. You might cheat users once but don't count on repetitive visits. What is more, you can get your site banned from social bookmarking sites, when you persistently post junk.

5 Make it easy for others to vote / bookmark your site

It is best when other people, not you, bookmark your site. Therefore, you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, Del.icio.us, and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add their button to your pages.
Here is a link that should help you add Links for Del.icio.us, Digg, and More to your pages.
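As a rough sketch, here is what such static HTML buttons might look like. The submit URLs and parameters below are assumptions based on the button code these services publish, so copy the exact code from each site rather than these examples:

<!-- Hypothetical page URL and title; replace with your own -->
<a href="http://digg.com/submit?url=http%3A%2F%2Fwww.somesites.com%2Fmy-article.htm&amp;title=My+Article">Digg this</a>
<a href="http://del.icio.us/post?url=http%3A%2F%2Fwww.somesites.com%2Fmy-article.htm&amp;title=My+Article">Bookmark on del.icio.us</a>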

6 Know when to submit

The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.

7 Submit to the right category

Sometimes a site might not work for you because there is no right category for you. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc. where all unclassified stuff goes. And since these categories fill very fast, your chance to get noticed decreases.

8 Build a top-profile

Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top-profile. Additionally, it is suspicious, when your profile has links to only one site. Many social bookmarking sites frown when users submit their own content because this feels like self-promotion.

9 Cooperate with other social bookmarkers

The lone wolf approach is a suicidal strategy on sites like StumbleUpon, Digg, and Netscape. Many stories make it to the front page not only because they are great but because they are backed up by your network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you. 50 votes can get you to the top page of Digg.

10 Submit in English

Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, for most social bookmarking sites, submitting anything in a language other than English is not recommended. The languages at an especial disadvantage are Chinese, Arabic, Slavic languages and all the others that use a non-Latin alphabet. German, Spanish and French are more understandable, but still they are not English. If you really must submit your story (i.e. because you need the backlink), include at least an English translation of the title. But the best way to proceed with non-English stories is to post them where they belong. Check this link for a list of non-English sites.

11 Never submit old news

Submitting old news will not help you in becoming a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, howtos and similar pieces that are up-to-date for a long time.

12 Check your facts

You must be flattered that users read your postings, but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts – or your readers will do it for you.

13 Check your spelling

Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.

14 Not all topics do well

But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and The War in Iraq on Netscape. Negative stories - about George Bush, Microsoft, evil multinational companies, corruption and crime also have a chance to make it to the front page. You can't know these things in advance but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.

15 Have Related Articles / Popular Articles

Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in and in a day or two they are gone. Unfortunately this is true – after your listing rolls from the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users come just following the link to your article, have a look at it and then they are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.

16 RSS feeds, newsletter subscriptions, affiliate marketing

RSS feeds, newsletter subscriptions and affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it will subscribe to your RSS feeds and/or newsletter. So, put these in visible places, and you will be astonished at the number of new subscriptions you get on the day you hit the front page of a major social bookmarking site.
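One simple way to keep the feed visible is to declare it in the page head (so browsers and feed readers can auto-discover it) and also link to it in plain sight; a minimal sketch, assuming the feed lives at /rss.xml:

<!-- Auto-discovery in the page head -->
<link rel="alternate" type="application/rss+xml" title="Site feed" href="http://www.somesites.com/rss.xml" />

<!-- A visible subscription link in the page body -->
<a href="http://www.somesites.com/rss.xml">Subscribe to our RSS feed</a>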

17 Do not use automated submitters

After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time and using automated submitters might look like the solution but it isn't. Automated submitters often have malware in them or are used for stealing passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.

18 Respond to comments on your stories

Social bookmarking sites are not newsgroups, but interesting articles can trigger a pretty heated discussion with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and to build a top-profile.

19 Prepare your server for the expected traffic

This is hardly a point of minor importance, but we take it for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind, though, that your presence on the front page of a major social bookmarking site can drive you a lot of traffic, which can cause your server to crash – literally!
I remember one of the times I was on the front page of Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site, and when visitors tried them, this put additional load on the server.
For an articles-only site, getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle a lot of simultaneous hits. Also, check if your monthly traffic allowance is enough to handle 200,000-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you have exceeded your bandwidth!

20 The snowball effect

Despite the differences in what the various social bookmarking communities like, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit, but there are many other examples. To use this fact to your best advantage, you may want to concentrate your efforts on getting to the front page of the major players only and bet on the snowball effect to drive you to the top on the other sites.
An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me and the result was that my PR jumped to 6 on the next update.

SEO Careers during a Recession - Tuesday, September 06, 2011

I don't know if many people became SEO experts because they planned ahead and thought that SEO careers are relatively stable in the long run, especially when compared to other business areas, or whether the reasons for making a career in SEO were completely different, but my feeling is that SEO experts are lucky now. Why? Because while the recession leaves many industries writhing in pain, many SEO professionals are in top financial shape and full of optimism for the future.

It would be an exaggeration to say that the SEO industry doesn't feel the recession at all. But when compared to industries such as automobiles, newspapers, banking, real estate, etc., SEO looks like a coveted island of financial security. This doesn't mean that there is no drop in volumes or that everybody in SEO is working for top dollar, but as a whole the SEO industry, and the individuals who make their living in SEO, are much better off than many other employees and entrepreneurs.

What Can You Expect from Your SEO Career During a Recession?

The question about what realistic expectations are is fundamental. I bet there are people in SEO, who are not very happy with their current situation and blame the recession for that. Well, if most of your clients were from troubled industries (cars, real estate, financial services, etc.), then you do have a reason to complain. In such cases you should be happy if you can pay the bills. What you can do (if you haven't already done it) is to look for new customers from other industries.

Another factor that influences your expectations about your SEO career during the recession is your position on the career ladder. It makes a big difference whether you work for a company or you are your own boss. Being an employee has always been a more vulnerable position, so if you expect job security, this is easier to achieve as an independent SEO contractor. Mass layoffs might not be common for SEO companies, but hired workers are never immune to them.

Additionally, your skill level also affects how your SEO career will be influenced by the recession. The recession is not the right time for novices to enter SEO. Many people from other industries rush to SEO as a life belt. When these people don't have the right skills and expertise but expect rivers of gold, this inevitably leads to disappointment.

What Makes SEO Careers Recession-Proof?

So, if you are a seasoned SEO practitioner and you don't dream of rivers of gold, you can feel safe with SEO because unlike careers in many other industries SEO careers are relatively recession-proof. Here are some of the reasons why SEO careers are recession-proof:

  • The SEO market is an established market. If you remember the previous recession from the beginning of the century, when the IT industry was among the most heavily stricken, you might be a bit skeptical that this time it won't be the same story. But it is not the same now: SEO is not a new service anymore, and the SEO market itself is far more established than it was a couple of years ago. This is what makes the present recession different from the previous one – the difference is fundamental and it can't be neglected.

  • SEO is one of the last expenses companies cut. SEO has already become a necessity for companies of any size. Unlike hardware, cars, not to mention entertainment and business trips, SEO expenses are usually not that big, but they help a company stay afloat. That is why when a company decides to make cuts in the budget, SEO expenses are usually not among the things that get the largest cut (or any cut at all).

  • SEO has great ROI. The Return On Investment (ROI) for money spent on SEO is much higher than the ROI for other types of investments. SEO brings companies money and this is what makes it such a great investment. Stop SEO and the money stops coming as well.

  • Many clients start aggressive SEO campaigns in an attempt to get better results fast. During a recession SEO is even more important. That is why many clients decide that an aggressive SEO campaign will help them get more clients and as a result these clients double their pre-recession budgets.

  • SEO is cheaper than PPC. SEO is just one of the many ways for a site to get traffic. However, it is also one of the most effective ways to drive tons of traffic. For instance, if you consider PPC, the cost advantages of SEO are obvious. PPC is very expensive and as a rule, ranking high in organic search results even for competitive keywords is cheaper than PPC.

  • Cheaper than traditional promotion methods. Traditional promotion methods (i.e. offline marketing) are still an option, but their costs are higher than even PPC and the other forms of online promotion. Besides, many companies have given up offline marketing completely and have turned to SEO as their major way to promote their business and attract new clients.

  • SEO is a recurring expense. Many businesses build their business model around memberships and other forms of recurring payments. For you, recurring clients are like presold campaigns – i.e. you know, more or less, that if a client is happy with a campaign you did for him or her, he or she will return. Acquiring recurring clients is very beneficial because it costs you less than acquiring new clients one by one.

The outlook for SEO careers during times of recession is pretty positive. As already mentioned, it is possible to experience drops in volumes or to see some of your clients go down the bankruptcy road, but as a whole SEO offers more stability than many other careers. If you manage to take advantage of the above-mentioned recession-proof specifics of SEO and you are a real professional, you won't get to feel the recession in all its bitterness.

Top 10 Costly Link Building Mistakes - Tuesday, September 06, 2011

Link building is one of the most important SEO activities but this certainly doesn't mean that you should build links at any price – literally and figuratively. Link building can be very expensive in terms of time and money. There are many costly link building mistakes and here are some of the most common:

1 Check if backlinks have a “nofollow” attribute

Link exchanges are still one of the white hat ways to build backlinks, but unfortunately there are many unscrupulous webmasters who will cheat you. One of the scams is that after you pay somebody for a backlink, it suddenly disappears or acquires the “nofollow” attribute. That is why you should check from time to time that the link is still there and that it doesn't have the “nofollow” attribute.
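When you check, it is enough to view the source of the page and look at the link's rel attribute. A minimal sketch of what to look for (the URL is hypothetical):

<!-- This backlink passes no value to you -->
<a href="http://www.somesites.com/" rel="nofollow">Some Site</a>

<!-- This is the backlink you actually agreed on -->
<a href="http://www.somesites.com/">Some Site</a>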

2 Getting good quality links but with useless anchor texts

It is great when the PR of the site you are getting links from is high, but when the anchor text is “Click here!” or something like that, such a link is barely useful. Keywords in the anchor text are vital, so if the backlink doesn't have them, it isn't a valuable one. Analyzing the anchor texts of links takes time, but the Backlink Anchor Text Analyzer tool can do the hard job for you.
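For comparison, a quick sketch of the two kinds of anchors (the keyword phrase is hypothetical):

<!-- A high-PR page, but the anchor text is wasted -->
<a href="http://www.somesites.com/">Click here!</a>

<!-- The same backlink with a keyword-rich anchor -->
<a href="http://www.somesites.com/">dynamic URL optimization tips</a>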

3 Getting an image link (when a text link with keyword is possible)

Sometimes when web masters hurry to get backlinks, they skip minor details, such as anchor text. Yes, an image link is great, and it could even bring you more visitors than a text link (if the image is attractive, of course, and users click it), but for SEO purposes nothing beats a keyword in the anchor text.

4 Not using ALT text if image link is the only possibility

Image links might be a worse option than text links, but if an image link is the only possibility to get a backlink, don't reject it. However, make sure that the ALT text of the image link contains your keywords – it is better than nothing.
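A minimal sketch of such a link, with a hypothetical file name and keyword phrase:

<!-- The ALT text carries the keywords the missing anchor text cannot -->
<a href="http://www.somesites.com/"><img src="banner.gif" alt="dynamic URL optimization tips" /></a>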

5 Getting backlinks from irrelevant websites

Now, this mistake is really a popular one! When hunting for backlinks, you should concentrate on relevant sites only. If you have a dating site, getting links from a finance one is not valuable. It is true that it is not easy to find relevant sites to get links from but unless your site is in a very narrow niche, chances are that there are hundreds or even thousands of relevant sites you can get a backlink from. If you need a list of such sites for your niche, try the Backlink Builder and see what suggestions it can give you.

6 Getting backlinks from sites/pages with tons of links

A backlink is more valuable if it comes from a page which is not cluttered with tons of other backlinks. Many pages have 200 or more links, and if your link is one of them, this isn't a great achievement. On the other hand, many directories put the “nofollow” attribute on nonpaid links, so actually even if there are 200 links on a page and most of them are “nofollow” (but yours isn't), your link still counts.

7 Links from pages spiders can't crawl

A link might look perfectly legitimate (i.e. keywords in the anchor text and no “nofollow” attribute) and still not count as a link. This is especially an issue with link exchanges, because you put a real link to the other site, while the link you get in return sits on a page search engines never see. Links Google can't index can be placed on dynamic pages or simply on pages which are not indexed by Google because robots.txt bans them. That is why it doesn't hurt to check from time to time whether the pages your links are placed on are accessible to spiders. The Search Engine Spider Simulator tool can help you do this in no time at all.

8 Explicitly selling links

There is hardly a web master who hasn't heard that paid links can hurt your rankings, but still many web masters don't miss the chance to make a few bucks. If you really want to sell links, you'd better use specialized link selling services, such as Backlinks.com, because they are more discreet. However, have in mind that while some of the paid links networks try to hide the fact that the links are paid, others are not that discreet. Maybe the worst gaffe you can make is to include phrases on your website like “Buy 5 PR links for $10” or any other hint that you are selling links. You can include “Advertise here!” or similar messages and still de facto sell paid links, but this is not as explicit as listing your prices for links.

9 Linking to sites with poor reputation

Linking to sites with poor reputation, also known as “bad neighbors” is one of the worst mistakes you can make. When you link to such sites, for Google this means that you endorse them and this results in penalties for you. That is why you must absolutely always check the sites (and their reputation) first before you link to them. Even if you are offered a lot of money to link to a site with poor reputation, you'd better decline the offer because otherwise your rating with search engines will suffer and this will cause you a lot of problems.

10 Linking to good sites gone bad

Even if you check carefully the sites you link to, sometimes it happens that a site, which used to be more or less decent all of a sudden starts publishing porn ads or other objectionable content. That is why it doesn't hurt if you check not only that the outbound links you have are not broken but also where they lead to.

Links are very important and that is why you should pay attention to what links you are getting. It is not a waste of time to monitor what's going on with your links and in addition to the tools listed in the article, you can also try the Backlink Summary tool.

How to Pick an SEO Friendly Designer - Monday, September 05, 2011
It is very important to hire a SEO-friendly designer because if you don't and your site is designed in a SEO-unfriendly fashion, you can't compensate for this later. This article will tell you how to pick a SEO-friendly designer and save yourself the disappointment of low rankings with search engines.

A Web designer is one of the persons without whom it is not possible to create a site. However, when SEO is concerned, Web designers can be really painful to deal with. While there are many Web designers, who are SEO-proficient, it is still not an exception to stumble upon design geniuses, who are focused only on the graphic aspect of the site. For them SEO is none of their business and they couldn't care less for something as unimportant as good rankings with search engines. Needless to say, if you hire such a designer, don't expect that your site will rank well with search engines.

If you will be doing SEO on your own, you might not care a lot about the SEO skills of your Web designer, but there are still design issues, as we'll see next, which can affect your rankings very badly. When the designer builds the site against SEO rules, it is not possible to fix this later with SEO tricks.

When we say that you need to hire a SEO-friendly designer, we presume that you are a SEO pro and you know SEO but if you aren't, then have a look at the SEO Tutorial and the SEO Checklist. If you have no idea about SEO, then you will hardly be able to select a SEO-friendly designer because you won't know what to look for.

One of the ultimate tests of whether a designer is SEO-friendly or not is to look at his or her past sites – are they done professionally, especially in the SEO department? If their past sites don't exhibit blatant SEO mistakes, such as the ones we'll list in a second, and they rank well, this is a recommendation that this person is worth hiring. In any case, after you look at past sites, ask the designer whether he or she did the SEO for them, because in some cases it might be that the client himself or herself has done a lot to optimize the site, and this is why the site ranks well.

Here is a checklist of common web design sins that will make your site a SEO disaster. If you notice any or all of the following in the past sites your would-be designer has created, just move to the next designer. These SEO-unfriendly design elements are absolute sins and unless the client made them do it, no designer who would use the below techniques deserves your attention:

1 Rely heavily on Flash

Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines and, as we point out in Optimizing Flash Sites, if the use of Flash is a must, then an HTML version of the same page is more than mandatory.
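One common compromise, sketched below under the assumption that the movie is called intro.swf and an HTML version lives at catalog.htm, is to provide real HTML content (or at least a link to the HTML version) as fallback inside the object element, so spiders have something to index:

<object type="application/x-shockwave-flash" data="intro.swf" width="600" height="400">
  <param name="movie" value="intro.swf" />
  <!-- Fallback content that search engines and non-Flash browsers can read -->
  <p>Our catalog is also available as a <a href="catalog.htm">plain HTML page</a>.</p>
</object>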

2 No internal links, or very few links

Internal links are backlinks and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, this is a missed chance to get backlinks.

3 Images, not text for anchors

This is another frequent mistake many designers make. Anchor text is vital in SEO and when your links lack anchor text, this is bad. It is true that for menu items and other page elements, it is much easier to use an image than text because with text you can never be sure it will display correctly on users' screens, but since this is impacting your site's rankings in a negative way, you should sacrifice beauty for functionality.

4 Messy code and tons of code

If you have no idea about HTML, then it might be impossible for you to judge whether a site's code is messy or the amount of code excessive, but clean code is an important criterion for SEO. When the code is messy, it might not be spiderable at all, and this can literally exclude your site from search engines because they won't be able to index it.

5 Excessive use of (SEO non-friendly) JavaScript

Similarly to Flash, search engines don't love JavaScript, especially tons of it. Actually, the worst thing about JavaScript is that, if it is not coded properly, your pages (or parts of them) may not be spiderable, which automatically means that they won't be indexed.
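A minimal sketch of the difference (the page name is hypothetical): the first link goes nowhere without JavaScript, so a spider can't follow it, while the second is a plain link that JavaScript can still enhance:

<!-- Spiders see href="#" and have nothing to crawl -->
<a href="#" onclick="window.location='products.htm'; return false;">Products</a>

<!-- A plain, crawlable link -->
<a href="products.htm">Products</a>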

6 Overoptimized sites

Overoptimized sites aren't better than under-optimized ones. In fact, they could be much worse, because when you stuff keywords and use other techniques (even when they are not Black Hat SEO) to artificially inflate the rankings of the site, this could get you banned from search engines – and that is the worst thing that can happen to a site.

7 Dynamic and other SEO non-friendly URLs

Well, maybe dynamic URLs are not exactly a design issue, but if you are getting a turn-key site – i.e. it is not up to you to upload and configure it and to create the links inside – then dynamic URLs are bad and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO non-friendly URLs on your own, but this actually means making dramatic changes to the site, and that is hardly the point of hiring a designer.

These points are very important, and this is why you need to follow them when you are choosing a SEO-friendly designer. Some of the items on the list are so bad for SEO (i.e. Flash, JavaScript) that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for a SEO-unfriendly design – it can be really expensive!
